30 research outputs found

    Concatenated codes with convolutional inner codes

    Nonequivalent cascaded convolutional codes obtained from equivalent constituent convolutional encoders

    Cascaded convolutional codes with conventional convolutional codes as constituent codes are powerful and attractive for communication systems where very low error probabilities are required. This paper demonstrates the dramatic effect that replacing the inner convolutional encoder with an equivalent one can have on the cascaded convolutional code.

    Some distance properties of tailbiting codes

    The active tailbiting segment distance for convolutional codes is introduced. Together with the previously defined active burst distance, it describes the error-correcting capability of a tailbiting code encoded by a convolutional encoder. Lower bounds on the new active distance, as well as an upper bound on the ratio between the tailbiting length and the memory of the encoder such that its minimum distance equals the free distance of the convolutional code, are derived.

    Active distances and cascaded convolutional codes

    A family of active distances for convolutional codes is introduced. Lower bounds are derived for the ensemble of periodically time-varying convolutional codes.
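    The active distances studied here generalize the free distance of a convolutional code. As a minimal baseline sketch (an illustrative example, not taken from the paper), the following computes the free distance of the standard memory-2, rate-1/2 encoder with generator polynomials (7, 5) in octal, via a shortest-path search over detours that leave the all-zero state and re-merge with it:

```python
import heapq

def free_distance():
    """Free distance of the rate-1/2, memory-2 encoder (7, 5) octal:
    the minimum Hamming weight over all nonzero paths that diverge from
    and re-merge with the all-zero trellis state."""
    def step(state, u):
        s0, s1 = state
        v0 = u ^ s0 ^ s1            # g0 = 1 + D + D^2  (7 octal)
        v1 = u ^ s1                 # g1 = 1 + D^2      (5 octal)
        return (u, s0), v0 + v1     # next state, branch weight
    # Dijkstra from the state reached by input 1 out of the zero state;
    # reaching (0, 0) again closes the detour.
    start, w0 = step((0, 0), 1)
    best = {start: w0}
    pq = [(w0, start)]
    while pq:
        w, st = heapq.heappop(pq)
        if st == (0, 0):
            return w
        if w > best.get(st, float('inf')):
            continue
        for u in (0, 1):
            nst, dw = step(st, u)
            if w + dw < best.get(nst, float('inf')):
                best[nst] = w + dw
                heapq.heappush(pq, (w + dw, nst))
    return None
```

    For this encoder the search returns 5, the well-known free distance of the (7, 5) code.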

    Interleaver Design for Turbo Coding

    Achieving unequal error protection via woven codes: Construction and analysis

    In this paper, several rate R = 2/5 convolutional encoding matrices with four states are used to illustrate how unequal error protection can be achieved for the different positions in the code sequences as well as in the information sequences. Free output and input distances and active burst output and input distances are computed. Typically, a single code cannot combine a large free output or input distance with a steep slope. A constructive way to obtain powerful "unequal" coding schemes that provide both large distances and steep slopes is to combine several constituent encoders. Various woven schemes are studied, and lower bounds on the active burst output and input distances are derived.

    A distance measure tailored to tailbiting codes

    The error-correcting capability of tailbiting codes generated by convolutional encoders is described. In order to obtain a description beyond what the minimum distance d_min of the tailbiting code implies, the active tailbiting segment distance is introduced. The description of correctable error patterns via active distances leads to an upper bound on the block error probability when decoding tailbiting codes. The length a tailbiting code needs for its minimum distance to equal the free distance d_free of the convolutional code encoded by the same encoder is easily obtained from the active tailbiting segment distance. This is useful when designing and analyzing concatenated convolutional codes whose component codes are terminated with the tailbiting method. Lower bounds on the active tailbiting segment distance, and an upper bound on the ratio between the tailbiting length and the memory of the convolutional generator matrix such that d_min equals d_free, are derived. Furthermore, affine lower bounds on the active tailbiting segment distance suggest that good tailbiting codes are generated by convolutional encoders with large active-distance slopes.
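    The tailbiting termination discussed above is easy to sketch for a feedforward encoder: the start state is chosen as the last m information bits, so the trellis begins and ends in the same state and no tail bits are transmitted. A minimal sketch, again assuming the (7, 5) octal memory-2, rate-1/2 encoder (an illustrative choice, not from the paper):

```python
def tailbiting_encode(info_bits):
    """Tailbiting encoding with the rate-1/2, memory-2 encoder (7, 5) octal.

    For a feedforward encoder the tailbiting start state is simply the
    last m = 2 information bits, so the encoder starts and ends in the
    same state and no termination bits are needed."""
    m = 2
    assert len(info_bits) >= m
    state = [info_bits[-1], info_bits[-2]]  # [u_{K-1}, u_{K-2}]
    out = []
    for u in info_bits:
        v0 = u ^ state[0] ^ state[1]  # g0 = 1 + D + D^2  (7 octal)
        v1 = u ^ state[1]             # g1 = 1 + D^2      (5 octal)
        out += [v0, v1]
        state = [u, state[0]]         # shift the register
    assert state == [info_bits[-1], info_bits[-2]]  # end state == start state
    return out
```

    Because no tail bits are sent, a length-K information block yields exactly 2K code bits, i.e. the tailbiting code keeps the full rate 1/2 with no fractional rate loss from termination.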

    Encoder and distance properties of woven convolutional codes with one tailbiting component code

    Woven convolutional codes with one tailbiting component code are studied, and their generator matrices are given. It is shown that, if the constituent encoders are identical, a woven convolutional encoder with an outer convolutional warp and one inner tailbiting encoder (WIT) generates the same code as a woven convolutional encoder with one outer tailbiting encoder and an inner convolutional warp (WOT). However, for tailbiting encoders of rate R_tb < 1, the WOT cannot be an encoder realization with a minimum number of delay elements. Lower bounds on the free distance and the active distances of woven convolutional codes with a tailbiting component code are given. These bounds are equal to those for woven codes consisting exclusively of unterminated convolutional codes; however, for woven convolutional codes with one tailbiting component code, the conditions under which the bounds hold are less strict.